if you're using the huggingface/transformers ecosystem and not downloading your model checkpoints in parallel: you may find it useful to `from huggingface_hub import snapshot_download` and then set some variable, say `x`, to `snapshot_download("hf/repo-id")`.

if you have your `HF_HOME` env variable set (`import os; os.environ['HF_HOME'] = 'location_here'`, done before importing anything huggingface so it actually takes effect) it should be fine, but if you don't, specify `cache_dir='location_here'` in the `snapshot_download` call. `x` will be a string containing your local folder path, which you can pass to `AutoModelForWhateverTheGunk.from_pretrained`.

you can even get lazy and just wrap the model name:

```python
model = AutoModelForWhoGivesAGloop.from_pretrained(
    snapshot_download(args.model_name),
    **other_model_params,
)
```

but I haven't tested that on multiple ranks; you should be downloading on one rank anyway.
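in case it helps, here's a minimal sketch of the one-rank-downloads pattern, assuming `torch.distributed` is already initialized; the repo id, cache path, and `AutoModelForCausalLM` class are placeholders for whatever you're actually loading:

```python
import torch.distributed as dist
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM  # stand-in for your real AutoModelFor* class

REPO_ID = "org/model-name"    # placeholder repo id
CACHE_DIR = "/path/to/cache"  # placeholder; skip if HF_HOME is set before hf imports

def download_on_rank_zero(repo_id: str, cache_dir: str) -> str:
    """rank 0 downloads; everyone else waits, then resolves the cached copy."""
    if dist.get_rank() == 0:
        snapshot_download(repo_id, cache_dir=cache_dir)
    dist.barrier()  # make sure the files exist before other ranks go looking
    # cache hit on every rank: returns the local folder path, no re-download
    return snapshot_download(repo_id, cache_dir=cache_dir)

# assumes dist.init_process_group(...) has already been called
local_path = download_on_rank_zero(REPO_ID, CACHE_DIR)
model = AutoModelForCausalLM.from_pretrained(local_path)
```

the second `snapshot_download` call is just a cache hit on the non-zero ranks, so everyone ends up with the same folder path.

~ X_+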